All Questions
15 questions
0 votes · 0 answers · 273 views
Correct method to report RandomizedSearchCV results
I have searched online but still cannot find a definitive answer on how to "correctly" report the results of hyperparameter tuning a machine learning model, though this may just be some ...
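A minimal sketch of one common way to report tuning results: the best parameters plus the mean and standard deviation of the cross-validated score for the chosen candidate, read from `cv_results_`. The estimator and search space are illustrative stand-ins, not the asker's setup.

```python
from scipy.stats import loguniform
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RandomizedSearchCV

X, y = make_classification(n_samples=500, random_state=0)

search = RandomizedSearchCV(
    LogisticRegression(max_iter=1000),
    param_distributions={"C": loguniform(1e-3, 1e2)},
    n_iter=20, cv=5, random_state=0,
)
search.fit(X, y)

# Report mean +/- std of the CV score for the winning candidate,
# not just the single best number.
best = search.best_index_
print("best params:", search.best_params_)
print("CV accuracy: %.3f +/- %.3f"
      % (search.cv_results_["mean_test_score"][best],
         search.cv_results_["std_test_score"][best]))
```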
2 votes · 2 answers · 3k views
GridSearchCV / RandomizedSearchCV extremely slow with SVM (SVC)
I'm testing hyperparameters for an SVM; however, when I resort to GridSearchCV or RandomizedSearchCV, I can't get a result because the processing time runs into hours. My dataset ...
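SVC training scales worse than quadratically with the number of samples, which is usually where the hours go. A sketch of one common workaround, assuming an invented stand-in dataset: tune on a random subsample with a small `n_iter`, a larger kernel cache, and all cores.

```python
import numpy as np
from scipy.stats import loguniform
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(20000, 30))          # stand-in for the real dataset
y = (X[:, 0] > 0).astype(int)

idx = rng.choice(len(X), size=2000, replace=False)   # tune on a subsample
search = RandomizedSearchCV(
    SVC(cache_size=500),                  # bigger kernel cache helps too
    param_distributions={"C": loguniform(1e-2, 1e2),
                         "gamma": loguniform(1e-4, 1e0)},
    n_iter=10, cv=3, n_jobs=-1, random_state=0,
)
search.fit(X[idx], y[idx])
print(search.best_params_)
```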
1 vote · 1 answer · 1k views
How to train multioutput classification with hyperparameter tuning in sklearn?
I am working on a simple multioutput classification problem and noticed this error showing up whenever I run the code below: ...
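A hedged sketch of the general setup, not a reconstruction of the asker's code: when tuning inside `MultiOutputClassifier`, grid keys must be prefixed with `estimator__` so they reach the wrapped classifier.

```python
from sklearn.datasets import make_multilabel_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV
from sklearn.multioutput import MultiOutputClassifier

X, Y = make_multilabel_classification(n_samples=300, n_classes=3,
                                      random_state=0)

model = MultiOutputClassifier(RandomForestClassifier(random_state=0))
grid = GridSearchCV(
    model,
    # "estimator__" routes each key to the wrapped RandomForestClassifier
    param_grid={"estimator__n_estimators": [50, 100],
                "estimator__max_depth": [None, 5]},
    cv=3,
)
grid.fit(X, Y)
print(grid.best_params_)
```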
2 votes · 1 answer · 2k views
How to prevent machine crashes while searching for hyperparameters of XGBoost with GridSearchCV
I am searching for the best hyperparameters of XGBRegressor using GridSearchCV. Here is the code: ...
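A sketch of the usual defensive settings, assuming the crash comes from nested parallelism and memory pressure: let GridSearchCV parallelize, pin the regressor to one thread, and cap `pre_dispatch` so all candidate datasets aren't copied into memory at once.

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV
from xgboost import XGBRegressor

X, y = make_regression(n_samples=1000, n_features=20, random_state=0)

grid = GridSearchCV(
    XGBRegressor(n_jobs=1, tree_method="hist"),  # one thread per model
    param_grid={"max_depth": [3, 5], "n_estimators": [100, 300]},
    cv=3,
    n_jobs=2,                 # modest outer parallelism
    pre_dispatch="2*n_jobs",  # limit queued (and copied) fit jobs
)
grid.fit(X, y)
print(grid.best_params_)
```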
0 votes · 1 answer · 2k views
MLPClassifier GridSearchCV parameters to tune?
I'm looking to tune the parameters of sklearn's MLPClassifier but don't know which ones to tune or how many options to give each. Take learning rate, for example: should I give it [.0001, .001, .01, .1, .2, .3]? Or is ...
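A sketch of a starting grid, following the common advice to search learning rates on a log scale rather than a dense linear list. The specific values are illustrative, not a recommendation.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=400, random_state=0)

grid = GridSearchCV(
    MLPClassifier(max_iter=500, random_state=0),
    param_grid={
        "hidden_layer_sizes": [(50,), (100,), (50, 50)],
        "alpha": [1e-4, 1e-3, 1e-2],               # L2 penalty
        "learning_rate_init": [1e-4, 1e-3, 1e-2],  # log-spaced, not linear
    },
    cv=3, n_jobs=-1,
)
grid.fit(X, y)
print(grid.best_params_)
```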
1 vote · 1 answer · 165 views
Random forest model giving same accuracy for different feature sets after tuning
I am having a weird issue and cannot seem to find a solution. I am trying to tune a different random forest model for each feature set. Basically, from a given data set, I have created 3 ...
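A minimal sketch of the general pattern (names invented, not the asker's code): tune one independent random forest per feature set, giving each loop iteration a fresh search object so results from one subset cannot leak into another.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = make_classification(n_samples=500, n_features=30, random_state=0)
feature_sets = {"set_a": range(0, 10),
                "set_b": range(10, 20),
                "set_c": range(20, 30)}

for name, cols in feature_sets.items():
    # fresh estimator and search per feature set
    search = RandomizedSearchCV(
        RandomForestClassifier(random_state=0),
        param_distributions={"n_estimators": [100, 200, 400],
                             "max_depth": [None, 5, 10]},
        n_iter=5, cv=3, random_state=0,
    )
    search.fit(X[:, list(cols)], y)
    print(name, round(search.best_score_, 3), search.best_params_)
```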
4 votes · 2 answers · 6k views
GridSearchCV vs RandomizedSearchCV: how do they work?
Can somebody explain in detail the differences between GridSearchCV and RandomizedSearchCV, and how the algorithms work under the hood? As per my understanding from the ...
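A side-by-side sketch of the core difference: GridSearchCV exhaustively tries every combination in the grid (here 3 × 3 = 9 fits per CV split), while RandomizedSearchCV draws `n_iter` samples from the given distributions.

```python
from scipy.stats import loguniform
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, random_state=0)

# exhaustive: every point of the 3x3 grid is evaluated
grid = GridSearchCV(SVC(),
                    param_grid={"C": [0.1, 1, 10],
                                "gamma": [0.01, 0.1, 1]}, cv=3)

# sampled: 9 random draws from continuous distributions
rand = RandomizedSearchCV(SVC(),
                          param_distributions={"C": loguniform(1e-2, 1e2),
                                               "gamma": loguniform(1e-3, 1e1)},
                          n_iter=9, cv=3, random_state=0)

for search in (grid, rand):
    search.fit(X, y)
    print(type(search).__name__, search.best_params_)
```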
1 vote · 1 answer · 9k views
Hyperparameter tuning with a validation set
As far as I know, and correct me if I am wrong, using cross-validation for hyperparameter tuning is not advisable when I have a huge dataset. So, in this case, it is better to split the data in ...
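A sketch of one way to keep the GridSearchCV machinery but score on a single held-out validation set instead of k folds: `PredefinedSplit`, with -1 marking rows that always stay in training and 0 marking the validation fold.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import (GridSearchCV, PredefinedSplit,
                                     train_test_split)

X, y = make_classification(n_samples=2000, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2,
                                                  random_state=0)

# -1 = always in training, 0 = the single validation fold
fold = np.r_[np.full(len(X_train), -1), np.full(len(X_val), 0)]
split = PredefinedSplit(fold)

grid = GridSearchCV(LogisticRegression(max_iter=1000),
                    param_grid={"C": [0.01, 0.1, 1, 10]},
                    cv=split)
grid.fit(np.vstack([X_train, X_val]), np.r_[y_train, y_val])
print(grid.best_params_)
```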
1 vote · 0 answers · 49 views
Minimizing overfitting when doing hyperparameter tuning
Generally, when using sklearn's GridSearchCV (or RandomizedSearchCV), we get the model with the best test score even if the model overfits a little. How can we compute the generalization error ...
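A sketch of nested cross-validation, one standard way to estimate the generalization error of the tuning procedure itself rather than trusting the (optimistically biased) best CV score from the search.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, random_state=0)

# inner loop: hyperparameter search; outer loop: unbiased evaluation
inner = GridSearchCV(SVC(), param_grid={"C": [0.1, 1, 10]}, cv=3)
outer_scores = cross_val_score(inner, X, y, cv=5)  # retunes in each fold
print("estimated generalization accuracy: %.3f +/- %.3f"
      % (outer_scores.mean(), outer_scores.std()))
```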
7 votes · 1 answer · 4k views
How to pass parameters to sklearn pipeline stages?
I'm working on a deep neural model for text classification using Keras. To fine-tune some hyperparameters I'm using the Keras wrappers for the scikit-learn API, so I built a sklearn Pipeline for it: ...
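A sketch of the naming convention the search needs: each grid key is `<step name>__<parameter>`, so parameters route to the right pipeline stage. Shown with plain sklearn estimators and a toy corpus rather than the asker's Keras wrapper; the same prefixing convention applies to a wrapped Keras model.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

texts = ["good movie", "bad movie", "great film", "awful film"] * 10
labels = [1, 0, 1, 0] * 10

pipe = Pipeline([("tfidf", TfidfVectorizer()),
                 ("clf", LogisticRegression(max_iter=1000))])

grid = GridSearchCV(
    pipe,
    param_grid={
        "tfidf__ngram_range": [(1, 1), (1, 2)],  # goes to the vectorizer
        "clf__C": [0.1, 1, 10],                  # goes to the classifier
    },
    cv=2,
)
grid.fit(texts, labels)
print(grid.best_params_)
```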
1 vote · 1 answer · 313 views
Track underlying observation when using GridSearchCV and make_scorer
I'm doing a GridSearchCV, and I've defined a custom function (called custom_scorer below) to optimize for. So the setup is like this: ...
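A sketch of the basic custom-scorer setup with `make_scorer` (the metric below is an invented example). Note that a `make_scorer` function only ever sees `(y_true, y_pred)`, not row indices; for tracking underlying observations, sklearn also accepts a plain callable with signature `(estimator, X, y)` as `scoring`, which does receive the fold's X.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.metrics import make_scorer
from sklearn.model_selection import GridSearchCV

def custom_scorer(y_true, y_pred):
    # illustrative metric: negative mean absolute error
    return -np.abs(y_true - y_pred).mean()

X, y = make_regression(n_samples=200, random_state=0)
grid = GridSearchCV(Ridge(), param_grid={"alpha": [0.1, 1.0, 10.0]},
                    scoring=make_scorer(custom_scorer), cv=3)
grid.fit(X, y)
print(grid.best_params_)
```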
1 vote · 0 answers · 371 views
How to put KerasClassifier, Hyperopt and Sklearn cross-validation together
I am performing hyperparameter tuning (hyperopt) tasks with sklearn on Keras models. I am trying to optimize a KerasClassifier using sklearn cross-validation. Some code follows: ...
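A sketch of the general wiring, shown with a plain sklearn estimator standing in for the KerasClassifier wrapper (which occupies the same estimator slot): hyperopt proposes parameters, `cross_val_score` scores them, and `fmin` minimizes the negated mean CV score.

```python
from hyperopt import fmin, hp, tpe
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=400, random_state=0)

def objective(params):
    model = GradientBoostingClassifier(
        learning_rate=params["lr"],
        n_estimators=int(params["n_estimators"]),
        random_state=0,
    )
    # hyperopt minimizes, so negate the mean CV accuracy
    return -cross_val_score(model, X, y, cv=3).mean()

space = {"lr": hp.loguniform("lr", -6, 0),
         "n_estimators": hp.quniform("n_estimators", 50, 300, 50)}

best = fmin(objective, space, algo=tpe.suggest, max_evals=20)
print(best)
```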
2 votes · 1 answer · 2k views
XGBoost (sklearn interface) regression error
I am trying to run GridSearchCV (sklearn) on XGBRegressor. The documentation on the objective parameter says that for regression, objective = reg:squarederror (see https://...
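A sketch of setting the regression objective explicitly; the older string `reg:linear` was deprecated in favour of `reg:squarederror`, which is a common source of this warning/error in GridSearchCV runs.

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV
from xgboost import XGBRegressor

X, y = make_regression(n_samples=500, n_features=10, random_state=0)

grid = GridSearchCV(
    XGBRegressor(objective="reg:squarederror"),  # explicit, non-deprecated
    param_grid={"max_depth": [3, 5], "learning_rate": [0.05, 0.1]},
    cv=3, scoring="neg_mean_squared_error",
)
grid.fit(X, y)
print(grid.best_params_)
```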
7 votes · 1 answer · 4k views
How to decide how many n_neighbors to consider while implementing LocalOutlierFactor?
I have a data set with 134,000 rows and 200 columns. I am trying to identify the outliers in the data set using LocalOutlierFactor from scikit-learn. Although I ...
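LOF is unsupervised, so there is no built-in CV score to tune against. A heuristic sketch on synthetic data with planted outliers: sweep `n_neighbors` and watch how stable the set of flagged points is (the scikit-learn docs suggest 20 as a reasonable default to start from).

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(size=(1000, 5)),
               rng.normal(loc=6, size=(20, 5))])  # 20 planted outliers

for k in (10, 20, 35, 50):
    lof = LocalOutlierFactor(n_neighbors=k, contamination="auto")
    labels = lof.fit_predict(X)          # -1 = outlier, 1 = inlier
    print(f"n_neighbors={k}: flagged {np.sum(labels == -1)} points")
```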
1 vote · 1 answer · 2k views
Is there any alternative to the L-BFGS-B algorithm for hyperparameter optimization in scikit-learn?
Gaussian process regression can be computed in scikit-learn using an object of class GaussianProcessRegressor as: ...
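GaussianProcessRegressor accepts a custom callable for its `optimizer` parameter: it receives `(obj_func, initial_theta, bounds)` and must return `(theta_opt, func_min)`. A sketch swapping L-BFGS-B for scipy's `differential_evolution`, one possible gradient-free alternative, not a recommendation:

```python
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def de_optimizer(obj_func, initial_theta, bounds):
    # obj_func returns (value, gradient) when eval_gradient=True;
    # differential_evolution is gradient-free, so ask for the value only.
    result = differential_evolution(
        lambda theta: obj_func(theta, eval_gradient=False),
        bounds, seed=0, maxiter=50,
    )
    return result.x, result.fun

X = np.linspace(0, 10, 50)[:, None]
y = np.sin(X).ravel()

gpr = GaussianProcessRegressor(kernel=RBF(), optimizer=de_optimizer)
gpr.fit(X, y)
print(gpr.kernel_)  # kernel with hyperparameters found by the custom optimizer
```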